
    Generalizing smoothness constraints from discrete samples

    We study how certain smoothness constraints, for example piecewise continuity, can be generalized from a discrete set of analog-valued data by modifying the error backpropagation learning algorithm. Numerical simulations demonstrate that by imposing two heuristic objectives during learning, (1) reducing the number of hidden units and (2) minimizing the magnitudes of the weights in the network, one obtains a network whose response function smoothly interpolates between the training data.
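
    A minimal sketch of the second heuristic only (penalizing weight magnitudes): ordinary back-propagation on a one-hidden-layer network with an added L2 weight-decay term. The network size, training data, learning rate, and decay coefficient below are illustrative assumptions, not the paper's experimental setup, and the hidden-unit-reduction heuristic is not implemented here.

        import numpy as np

        rng = np.random.default_rng(0)
        X = np.linspace(-1.0, 1.0, 20).reshape(-1, 1)     # discrete analog-valued samples (assumed)
        y = np.sign(X) * np.abs(X) ** 0.5                 # a piecewise-smooth target (assumed)
        n_hidden, lr, decay = 8, 0.05, 1e-3               # assumed hyper-parameters

        W1 = rng.normal(scale=0.5, size=(1, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

        for _ in range(5000):
            h = np.tanh(X @ W1 + b1)                      # hidden-unit activations
            out = h @ W2 + b2                             # linear output unit
            err = out - y
            # gradient of the mean-squared error plus a weight-decay term 0.5 * decay * ||W||^2
            gW2 = h.T @ err / len(X) + decay * W2
            gb2 = err.mean(axis=0)
            dh = (err @ W2.T) * (1.0 - h ** 2)            # back-propagate through tanh
            gW1 = X.T @ dh / len(X) + decay * W1
            gb1 = dh.mean(axis=0)
            W1 -= lr * gW1; b1 -= lr * gb1
            W2 -= lr * gW2; b2 -= lr * gb2

        print("training MSE:", float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)))

    The decay term pulls every weight toward zero unless the data pushes back, which is one common way to realize the "minimize weight magnitudes" objective described above.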

    The VC-Dimension versus the Statistical Capacity of Multilayer Networks

    A general relationship is developed between the VC-dimension and the statistical lower epsilon-capacity, which shows that the VC-dimension can be lower bounded (in order) by the statistical lower epsilon-capacity of a network trained with random samples. This relationship explains quantitatively how generalization takes place after memorization, and relates the concept of generalization (consistency) to the capacity of the optimal classifier over a class of classifiers with the same structure and to the capacity of the Bayesian classifier. Furthermore, it provides a general methodology for evaluating a lower bound on the VC-dimension of feedforward multilayer neural networks. This methodology is applied to two types of networks that are important for hardware implementations: two-layer (N - 2L - 1) networks with binary weights, integer thresholds for the hidden units, and a zero threshold for the output unit; and a single neuron ((N - 1) networks) with binary weights and a zero threshold. Specifically, we obtain O(W/ln L) ≤ d_2 ≤ O(W) and d_1 ~ O(N), where W is the total number of weights of the (N - 2L - 1) networks, and d_1 and d_2 denote the VC-dimensions of the (N - 1) and (N - 2L - 1) networks, respectively.
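
    In display form, the reported bounds are as follows; this is a direct transcription of the abstract's notation, not an additional result.

        % d_2: VC-dimension of the two-layer (N - 2L - 1) binary-weight network
        % d_1: VC-dimension of the single binary-weight neuron ((N - 1) network)
        % W:   total number of weights of the (N - 2L - 1) network
        \[
          O\!\left(\frac{W}{\ln L}\right) \;\le\; d_2 \;\le\; O(W),
          \qquad
          d_1 \sim O(N).
        \]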

    Non-Stationary Random Process for Large-Scale Failure and Recovery of Power Distributions

    A key objective of the smart grid is to improve the reliability of utility services to end users. This requires strengthening the resilience of distribution networks that lie at the edge of the grid. However, distribution networks are exposed to external disturbances such as hurricanes and snow storms, in which electricity service to customers is disrupted repeatedly. External disturbances cause large-scale power failures that are neither well understood, nor formulated rigorously, nor studied systematically. This work studies the resilience of power distribution networks to large-scale disturbances in three aspects. First, a non-stationary random process is derived to characterize an entire life cycle of large-scale failure and recovery. Second, resilience is defined based on the non-stationary random process. Closed-form analytical expressions are derived under specific large-scale failure scenarios. Third, the non-stationary model and the resilience metric are applied to a real-life example of large-scale disruptions due to Hurricane Ike. Real data on large-scale failures from an operational network is used to learn time-varying model parameters and resilience metrics.
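
    A hypothetical sketch of one way to simulate a non-stationary failure-and-recovery process: failures arrive as a non-homogeneous Poisson process with a time-varying rate (generated by thinning), and each failed component is repaired after an exponentially distributed delay. The rate function, repair-time distribution, and horizon below are illustrative assumptions, not the model or parameters estimated in the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        T = 96.0                                  # observation horizon in hours (assumed)
        lam_max = 30.0                            # upper bound on the failure rate (assumed)

        def failure_rate(t):
            """Assumed storm-like rate: a pulse of failures centered at t = 24 h."""
            return lam_max * np.exp(-((t - 24.0) / 6.0) ** 2)

        # Thinning: candidate events at rate lam_max, accepted with probability failure_rate(t)/lam_max
        t, failures = 0.0, []
        while t < T:
            t += rng.exponential(1.0 / lam_max)
            if t < T and rng.random() < failure_rate(t) / lam_max:
                failures.append(t)
        failures = np.array(failures)
        repairs = failures + rng.exponential(8.0, size=failures.size)   # mean 8 h repair time (assumed)

        # Number of components simultaneously failed over time, a simple resilience-style trace
        grid = np.linspace(0.0, T, 400)
        outage = [(failures <= u).sum() - (repairs <= u).sum() for u in grid]
        print("peak simultaneous failures:", max(outage))

    The resulting outage trace rises during the disturbance and decays as repairs complete, which is the kind of life-cycle behavior the non-stationary model is meant to capture.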

    Capacity of Two-Layer Networks with Binary Weights

    The capacity C_b of two-layer (N - 2L - 1) feed-forward neural networks is shown to satisfy the relation O(W/ln W) ≤ C_b ≤ O(W). Here N - 2L - 1 stands for networks with N input units, 2L hidden units, and one output unit, and W is the total number of weights of the network. The weights take only binary values and the hidden units have integer thresholds.
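
    For concreteness, if the (N - 2L - 1) network is assumed to be fully connected (an assumption; the abstract does not state the connectivity, and thresholds are not counted), the weight count and the reported capacity bound read:

        % assumed full connectivity: 2LN input-to-hidden weights plus 2L hidden-to-output weights
        \[
          W = 2LN + 2L = 2L(N + 1),
          \qquad
          O\!\left(\frac{W}{\ln W}\right) \;\le\; C_b \;\le\; O(W).
        \]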